Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles from a
wide range of Information Technology disciplines, from the most basic research
to the most innovative technologies. Please submit your papers electronically
to our submission system at http://jatit.org/submit_paper.php in MS Word, PDF,
or a compatible format so that they may be evaluated for publication in the
upcoming issue. This journal uses a blinded review process; please remember to
include all your personally identifiable information in the manuscript before
submitting it for review; we will remove the necessary information at our side.
Submissions to JATIT should be full research / review papers (properly
indicated below the main title).
|
Journal of Theoretical and Applied Information Technology
November 2017 | Vol. 95 No. 21 |
Title: |
USING AN ANT COLONY OPTIMIZATION ALGORITHM FOR IMAGE EDGE DETECTION AS A
THRESHOLD SEGMENTATION FOR OCR SYSTEM |
Author: |
MARYAM ASGARI, FARSHID PIRAHANSIAH, MOHAMMAD SHAHVERDY, MEHDI FARTASH |
Abstract: |
Binarization, or thresholding, is a problem that must be solved in pattern
recognition methods and applications. Moreover, it strongly influences the
subsequent steps in computer vision applications such as Optical Character
Recognition (OCR), image segmentation, and object tracking. Ant colony
optimization (ACO) is a population-based metaheuristic used to solve
optimization problems in diverse fields such as traffic congestion and
control, structural optimization, manufacturing, and genomics. In this work,
ant colony optimization, edge detection, and thresholding methods are combined
for use in an OCR system. The algorithm was tested on printed and handwritten
images from the DIBCO 2009 dataset. The method is compared with Kittler and
Illingworth's Minimum Error Thresholding, potential difference, maximum
entropy, Pirahansiah's method, and Otsu's method. |
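A minimal sketch of the general ACO edge-detection idea the abstract describes:
ants walk the image guided by gradient "visibility" and deposit pheromone that
accumulates along edges. The parameter values and deposit rule here are
illustrative assumptions, not the authors' exact algorithm.

```python
# Illustrative ACO edge map: high pheromone marks likely edges.
import numpy as np

def aco_edge_map(img, n_ants=512, n_steps=40, alpha=1.0, beta=2.0,
                 rho=0.05, seed=0):
    rng = np.random.default_rng(seed)
    h, w = img.shape
    # Visibility: local intensity variation (gradient magnitude), normalized.
    gy, gx = np.gradient(img.astype(float))
    eta = np.hypot(gx, gy)
    eta /= eta.max() + 1e-12
    tau = np.full((h, w), 1e-4)                      # initial pheromone
    ants = np.column_stack([rng.integers(0, h, n_ants),
                            rng.integers(0, w, n_ants)])
    moves = [(-1,-1),(-1,0),(-1,1),(0,-1),(0,1),(1,-1),(1,0),(1,1)]
    for _ in range(n_steps):
        for k, (r, c) in enumerate(ants):
            nbrs = [(r+dr, c+dc) for dr, dc in moves
                    if 0 <= r+dr < h and 0 <= c+dc < w]
            p = np.array([tau[n]**alpha * eta[n]**beta for n in nbrs]) + 1e-12
            r, c = nbrs[rng.choice(len(nbrs), p=p/p.sum())]
            ants[k] = (r, c)
            tau[r, c] += eta[r, c]                   # deposit ~ visibility
        tau *= (1.0 - rho)                           # evaporation
    return tau / tau.max()

# Thresholding the pheromone map then yields a binary edge image:
# edges = aco_edge_map(gray_image) > 0.2
```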
Keywords: |
Ant Colony Optimization (ACO), Peak Signal-To-Noise Ratio (PSNR), Single
Thresholding, Image Processing, Image Segmentation, Optical Character
Recognition, Computer Vision, Deep Learning. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
AN EFFICIENT HECTIC COMPOSITION WEB SEQUENTIAL BASED PATTERN TREES IN LARGE
DATABASES |
Author: |
A.P.SELVA PRABHU, DR.T.RAVICHANDRAN |
Abstract: |
Web sequential pattern mining identifies frequent subsequences as patterns in
large databases. In this paper, a novel framework called Hectic Composition
Mining based on Approximate Pattern Trees (HCM-APT) is presented to handle
different access patterns through a linking operation. The framework extends
the tree-based indexing model, dynamically adjusting links during the mining
process using Composite Pattern Mining. A distinct feature of the HCM-APT
framework is that it clusters on a very limited and precisely predictable
space, which runs fast in a memory-based setting. As a result, HCM-APT scales
up to very large databases through database segregation, extensively
minimizing memory space. For dense bases, competent Approximate Pattern Trees
are constructed dynamically, significantly reducing the execution time for
obtaining rich properties. Finally, the proposed framework applies a scalable
mining model to the approximate patterns generated through the tree, using a
Variable Regression function to improve scalability when mining large
databases. Experimental results on the Amazon Commerce reviews dataset show
that the proposed HCM-APT framework outperforms other well-established methods
in identifying hectic composition patterns. Experiments were conducted on
factors such as execution time for obtaining rich properties, memory
consumption, and scalability in mining sequential patterns effectively. |
Keywords: |
Web Sequential Pattern Mining, Bidirectional Pattern, Access Patterns, Composite
Pattern Mining, Approximate Pattern Trees, Variable Regression Function |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
ACCELERATING THE OUTLIER DETECTION METHODS FOR CATEGORICAL DATA BY USING MATRIX
OF ATTRIBUTE VALUE FREQUENCY |
Author: |
NUR ROKHMAN, SUBANAR, EDI WINARKO |
Abstract: |
Based on the type of data, outlier detection methods can be classified into
three classes: methods that work on numerical data, on categorical data, and
on mixed-type data. Most outlier detection methods work on numerical data;
only a few work on categorical or mixed-type data. In this paper, a new method
for detecting outliers in categorical data, called Weighted Matrix Entropy
Value Frequency (WMEVF), is proposed. The method uses a weighting function to
improve precision and a matrix of attribute value frequencies to reduce
complexity. Four weighting functions are used in the experiments: range,
variance, standard deviation, and the square function. The performance of
WMEVF is evaluated on the outliers detected in UCI Machine Learning datasets
and the time needed to detect them. The experiments show that the square
function improved precision and that the matrix of attribute value frequencies
reduced the complexity from O(m*n²) to O(m*n). |
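An illustrative sketch of frequency-matrix outlier scoring for categorical
data in the spirit of the abstract (an AVF-style score with a per-attribute
weight); the exact WMEVF weighting functions in the paper may differ.

```python
import numpy as np
import pandas as pd

def weighted_avf_scores(df: pd.DataFrame, weight="square"):
    """Lower score = rarer attribute values = more likely an outlier."""
    n = len(df)
    scores = np.zeros(n)
    for col in df.columns:
        # One frequency lookup per attribute keeps the scan at O(m*n).
        freq = df[col].map(df[col].value_counts()) / n
        w = {"square": (freq ** 2).mean(),       # hypothetical weight choices
             "std": freq.std(),
             "range": freq.max() - freq.min()}[weight]
        scores += w * freq.to_numpy()
    return scores

# The k rows with the lowest scores are flagged as outliers:
# outliers = df.index[np.argsort(weighted_avf_scores(df))[:k]]
```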
Keywords: |
Outlier Detection, Categorical Data, Weighting Function, Entropy, Attribute
Value Frequency. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
Q-LEARNING WITH ADAPTIVE KANERVA CODING ON PROTEIN DOCKING |
Author: |
ERZAM MARLISAH, RAZALI YAAKOB, MD NASIR SULAIMAN, MOHD BASYARUDDIN ABDUL RAHMAN |
Abstract: |
Molecular docking is an important process in pharmaceutical research and drug
design. It is used to screen libraries of small molecules, or ligands, that
bind to a target protein, changing its original biochemical properties and
forming a new stable complex. In docking a ligand to a protein, the ligand's
pose (its position, orientation, and torsion angles) is repeatedly translated
and rotated to find an ideal binding site on the protein. In this paper, the
Q-learning algorithm, a model-free reinforcement learning method with adaptive
Kanerva coding, is used as the search algorithm for the protein-ligand docking
problem. We evaluate the effectiveness of Q-learning and different settings of
the reinforcement learning parameters. A popular docking tool, AutoDock Vina,
was used to find the ligands' goal poses. The effectiveness of the agent is
measured by its success in reaching these goals. The proposed agent managed to
match AutoDock Vina, and found better poses for medium- to large-size
ligands. |
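A minimal sketch of Q-learning with Kanerva coding over a continuous state
space, the combination named above; the prototype layout, radius, and learning
rates are illustrative, and the paper's state/action encoding for docking is
more involved.

```python
import numpy as np

class KanervaQ:
    def __init__(self, n_protos, state_dim, n_actions, radius=0.15,
                 alpha=0.1, gamma=0.95, seed=0):
        rng = np.random.default_rng(seed)
        self.protos = rng.random((n_protos, state_dim))  # random prototypes
        self.radius = radius
        self.w = np.zeros((n_protos, n_actions))  # weight per prototype/action
        self.alpha, self.gamma = alpha, gamma

    def features(self, s):
        # A prototype is "active" when the state falls within its radius.
        return (np.linalg.norm(self.protos - s, axis=1) < self.radius).astype(float)

    def q(self, s):
        return self.features(s) @ self.w           # Q-values for all actions

    def update(self, s, a, r, s_next):
        phi = self.features(s)
        td = r + self.gamma * self.q(s_next).max() - phi @ self.w[:, a]
        self.w[:, a] += self.alpha * td * phi / max(phi.sum(), 1.0)
```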
Keywords: |
Reinforcement Learning, Q-learning, Protein-Ligand Docking, Machine Learning,
Kanerva Coding |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
EFFECTIVENESS OF K-MEANS CLUSTERING TO DISTRIBUTE TRAINING DATA AND TESTING DATA
ON K-NEAREST NEIGHBOR CLASSIFICATION |
Author: |
MUSTAKIM |
Abstract: |
One of the constraints in classification is how to divide a dataset into two
parts, training and testing, in a way that represents the full data
distribution. The most commonly used technique is K-Fold Cross Validation,
which divides the data into several parts that alternately serve as training
and testing data. Splitting the data by percentage (70% and 30%) is another
common option in data mining research. K-Means is a clustering algorithm able
to maximize the effectiveness of distributing data for classification.
Experiments using K-Means Clustering with K-Nearest Neighbor (K-NN), validated
by a confusion matrix, achieved a highest accuracy of 93.4%, higher than the
K-Fold Cross Validation data distribution technique in each experiment, using
both Education Management Information System (EMIS) data and random data.
Distributing the data in groups can represent each member and increase the
accuracy of the classification algorithm, even though the experiments applied
only a 70%/30% training/testing split within each group. |
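A minimal sketch of the cluster-then-split idea described above, assuming
scikit-learn: cluster the data with K-Means, then take 70% of each cluster for
training and 30% for testing so every cluster is represented in both parts
(the number of clusters and the ratio are illustrative).

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_split(X, y, k=5, train_ratio=0.7, seed=0):
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    train_idx, test_idx = [], []
    for c in range(k):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        cut = int(train_ratio * len(idx))      # 70% of this cluster trains
        train_idx.extend(idx[:cut])
        test_idx.extend(idx[cut:])
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]
```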
Keywords: |
Confusion Matrix, K-Fold Cross Validation, K-Means Clustering, K-Nearest
Neighbor |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
A ROBUST DUAL SOURCE LEVEL SET METHOD FOR THREE-DIMENSIONAL ECHOCARDIOGRAPHY
IMAGE SEGMENTATION |
Author: |
NIMA SAHBA, EMAD FATEMIZADEH, HAMID BEHNAM |
Abstract: |
Echocardiography, as a commonplace device, is widely used for diagnosis.
Segmenting the left ventricle using only the standard level set formulation
based on intensity magnitude cannot fully restrict contour evolution, because
sharp edges are lacking. This paper introduces a robust method for segmenting
three-dimensional echocardiography images based on two boundary maps, a
distance map and a probabilistic map, which are extracted using two different
approaches and used as stopping criteria for contour evolution. The distance
map is extracted by applying the Free Form Deformation (FFD) method to
manually determined landmarks of an initial spherical volume, registering the
landmarks to the boundary of the left ventricle using B-Spline interpolation.
The probabilistic map is created independently as a contour guideline toward
probable desirable regions. Segmentation of echocardiography volumes is
implemented using a proposed energy function based on a level set formulation
combined with the two edge maps, named the dual source level set. Left
ventricle segmentation using the proposed method shows expert-approved
performance, with less than 2% ejection fraction error compared to manual
tracing by an expert, providing a reliable tool for clinical practice. Using
the combined distance and probabilistic maps as the stopping criterion in the
level set approach helps avoid local minima and excessive contour expansion
when the left ventricle valve is open or there is no clear edge. |
Keywords: |
Level set, Segmentation, three-dimensional echocardiography, Dual Source. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
MODELLING OF ASPECTS USING ASPECT-ORIENTED DESIGN LANGUAGE |
Author: |
SAQIB IQBAL, ABDALLA MANSUR, GARY ALLEN |
Abstract: |
Aspect composition is a vital step in aspect modelling. Aspects are composed
with each other and with base constructs through pointcuts defined in the
aspects. Design languages address this composition by providing composition
techniques and directives. However, most contemporary design languages lack
support for inter-aspect and inner-aspect compositions. Another problem is
resolving aspect interference, which arises as a result of composition.
Although some techniques have been proposed to overcome aspect interference at
the implementation level, the problem needs attention at the modelling level.
Eradicating interference and conflicts related to aspect composition at the
modelling stage could ensure better implementation and fewer conflicts. This
paper provides a composition strategy equipped with new design notations and
diagrams to support aspect compositions as well as inner-aspect compositions.
The paper also provides a technique to prioritize aspect execution at the
modelling stage, reducing aspect interference and aspect conflicts. |
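A language-level analogy of the prioritization idea, illustrative only (the
paper works at the modelling level, not in Python): applying "aspects" in an
explicit priority order avoids the interference that arises when two aspects
advise the same join point in an unspecified order.

```python
import functools

def aspect(before=None, after=None):
    def wrap(fn):
        @functools.wraps(fn)
        def advised(*args, **kwargs):
            if before: before(fn.__name__, args)
            result = fn(*args, **kwargs)
            if after: after(fn.__name__, result)
            return result
        return advised
    return wrap

def compose(fn, *aspects):
    """Apply aspects in ascending priority: the last one runs outermost."""
    for a in aspects:
        fn = a(fn)
    return fn

# Logging runs outside authorization, so even a denied call is still logged:
# op = compose(op, aspect(before=check_auth), aspect(before=log_call))
```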
Keywords: |
Aspect-Oriented Programming, Pointcut Modelling, Aspect Composition,
Aspect-Oriented Model |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
SUSTAINABLE SOFTWARE DEVELOPMENT LIFE CYCLE PROCESS MODEL BASED ON CAPABILITY
MATURITY MODEL INTEGRATION: A STUDY IN MALAYSIA |
Author: |
KOMEIL RAISIAN, JAMAIAH YAHAYA, AZIZ DERAMAN |
Abstract: |
Green software engineering is an important view of the software engineering
process in the current century. Previously, software engineers were concerned
mainly with developing hardware or software without giving much importance to
sustainability. The field of green software and green software engineering is
still young; thus, in modern society, research efforts are mainly focused on
green and sustainable software engineering itself. The development of
sustainable software has been identified as one of the key challenges in
computational science and software engineering, and there is no clear idea of
how to accomplish greenness and sustainability within the Software Development
Life Cycle (SDLC) stages. The aim of this study is therefore to adjust the
current SDLC and present a sustainable SDLC. To reach this goal, the study
first uses a standard software development process method, Capability Maturity
Model Integration (CMMI), to institutionalize the model. Next, the research
uses a survey with a non-probability (non-random) sample of 102 respondents
from international software organizations in Malaysia to identify the main
stages of the Green Software Development Life Cycle, which are then
investigated on that basis. |
Keywords: |
Software Development Life Cycle (SDLC), Green and sustainable software,
Capability Maturity Model Integration (CMMI), Model |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
PREDICTION MODEL DEVELOPMENT FOR ONSHORE CRUDE OIL PRODUCTION BASED ON OFFSHORE
PRODUCTION USING ARTIFICIAL NEURAL NETWORK |
Author: |
MARYONO, SUHARTONO |
Abstract: |
A long-standing fact of the oil and gas industry is the difference between
onshore and offshore daily net production figures. This study aims to provide
a model to reduce this difference by processing offshore data to predict daily
onshore production. The methods used are regression and an artificial neural
network (ANN). The data are daily offshore production and BSW (base sediment
and water) figures from 2013 to 2016. The results show that the ANN yields
better predictions than regression. From the observations made, the best
period for modeling is a 60-day end-period window, with the tanh activation
function and standardized preprocessing for the ANN. For further research, it
is worth considering temperature as a prediction variable. |
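A sketch of the windowed ANN setup the abstract describes: 60-day sliding
windows, standardized inputs, and a tanh-activated network (scikit-learn
based; the layer size, file name, and hold-out size are illustrative
assumptions, not the paper's configuration).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

def make_windows(series, width=60):
    # Each row holds the previous `width` days; the target is the next day.
    X = np.array([series[i:i + width] for i in range(len(series) - width)])
    y = series[width:]
    return X, y

offshore = np.loadtxt("offshore_daily.csv")        # hypothetical input file
X, y = make_windows(offshore, width=60)
X = StandardScaler().fit_transform(X)              # standardized preprocessing
model = MLPRegressor(hidden_layer_sizes=(32,), activation="tanh",
                     max_iter=2000, random_state=0).fit(X[:-200], y[:-200])
print("held-out R^2:", model.score(X[-200:], y[-200:]))
```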
Keywords: |
Crude Oil Production Figures, Prediction Model, Windowing, Regression,
Artificial Neural Network. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
IMPLEMENTATION OF A VEHICLE DETECTION SYSTEM IN THE FPGA EMBEDDED PLATFORM |
Author: |
ATIBI MOHAMED, BENRABH MOHAMED, ATOUF ISSAM, BOUSSAA MOHAMED, BENNIS ABDELLATIF |
Abstract: |
This paper presents the implementation of a vehicle detection system on an
FPGA platform. The system is based on two algorithms: an image processing
algorithm that combines detection of regions of interest through the vehicle's
shadow with a Haar-like-feature image descriptor for the suspected zones, and
a classification algorithm, an artificial neural network, that detects the
presence of vehicles in these zones. To evaluate the results, which showed
that the proposed system is a fast and robust vehicle detector, a hardware
implementation was performed on an FPGA embedded platform based on the NIOS II
microprocessor and its input/output devices. The results of this
implementation show that the FPGA platform maintains the performance of the
system in terms of computational efficiency and speed, making it a real-time
vehicle detection system. |
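A sketch of a Haar-like feature computed over an integral image, the
descriptor family named above (an illustrative two-rectangle feature; the
paper's feature set and NIOS II implementation details are not reproduced
here).

```python
import numpy as np

def integral_image(img):
    # ii[r, c] = sum of img[:r, :c]; zero-padded so rectangle sums need no
    # bounds checks.
    return np.pad(img.astype(np.int64).cumsum(0).cumsum(1), ((1, 0), (1, 0)))

def rect_sum(ii, r, c, h, w):
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect_vertical(ii, r, c, h, w):
    """Upper half minus lower half: responds to horizontal intensity edges
    such as the shadow line under a vehicle."""
    half = h // 2
    return rect_sum(ii, r, c, half, w) - rect_sum(ii, r + half, c, half, w)

# ii = integral_image(gray_patch); score = haar_two_rect_vertical(ii, 0, 0, 24, 24)
```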
Keywords: |
Vehicle Detection, Haar Like Features, Artificial Neural Network, FPGA, NIOS II,
Real Time |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
AVERAGE AND MAXIMUM WEIGHTS IN WEIGHTED ROTATION- AND SCALE-INVARIANT LBP FOR
CLASSIFICATION OF MANGO LEAVES |
Author: |
EKO PRASETYO, R. DIMAS ADITYO, NANIK SUCIATI, CHASTINE FATICHAH |
Abstract: |
Texture features are an important part of image classification. Local Binary
Pattern (LBP) is a feature extraction method that has seen many improvements
by researchers. Weighted Rotation- and Scale-invariant LBP (WRSI-LBP) is one
such improved version: it uses the minimum magnitude of the local differences
as an adaptive weight (WRSI-LBP-min) to adjust the contribution of each LBP
code in the histogram calculation, the motivation being that the minimum
magnitude gives minimum distortion when an LBP code changes in the histogram
calculation. In the mango leaf classification case, the texture
characteristics of mango leaves are very difficult to distinguish directly, so
high-accuracy detection requires texture features with strong discriminative
character that are robust to illumination change and insensitive to scaling
and rotation. To achieve this goal, we propose the average and maximum
magnitudes of the local differences as adaptive weights for WRSI-LBP
(WRSI-LBP-avg and WRSI-LBP-max). This scheme can be used to generate texture
features for mango leaf classification and for general classification cases.
The motivation for the average weight is to cover all local difference
magnitudes, because each generated LBP code has a unique neighbor pattern; the
motivation for the maximum is that although it gives maximum distortion when
an LBP code changes, it gives the highest local difference magnitude. We use
Support Vector Machine (SVM) and K-Nearest Neighbor (K-NN) as classification
methods, and 240 images covering three varieties (Gadung, Jiwo, and Manalagi)
for performance evaluation, with K-Fold Cross Validation and Leave-One-Out as
validation methods. The experiments show that WRSI-LBP-avg and WRSI-LBP-max
achieve the highest accuracy compared to WRSI-LBP-min, LBP, Center-Symmetric
LBP (CS-LBP), and Dominant Rotated Local Binary Pattern (DRLBP): SVM achieves
75.21% accuracy with 16 bins, while K-NN achieves 79.17% with 256 bins. For
uniform patterns, we apply the experiments to WRSI-LBP-min, WRSI-LBP-avg, and
WRSI-LBP-max; the highest accuracy is again achieved by WRSI-LBP-avg and
WRSI-LBP-max. |
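A sketch of a weighted LBP histogram in which each pixel's code is accumulated
with a weight derived from its local difference magnitudes (min / avg / max,
as the abstract describes); this is an 8-neighbor, radius-1 illustration
without the rotation- and scale-invariance machinery of WRSI-LBP.

```python
import numpy as np

def weighted_lbp_hist(img, weight="avg", bins=256):
    img = img.astype(float)
    offsets = [(-1,-1),(-1,0),(-1,1),(0,1),(1,1),(1,0),(1,-1),(0,-1)]
    center = img[1:-1, 1:-1]
    # Signed differences between each neighbor and the center pixel.
    diffs = np.stack([img[1+dr:img.shape[0]-1+dr, 1+dc:img.shape[1]-1+dc] - center
                      for dr, dc in offsets])
    codes = sum((diffs[i] >= 0).astype(np.int64) << i for i in range(8))
    mags = np.abs(diffs)
    w = {"min": mags.min(0), "avg": mags.mean(0), "max": mags.max(0)}[weight]
    hist = np.bincount(codes.ravel(), weights=w.ravel(), minlength=bins)
    return hist / (hist.sum() + 1e-12)               # normalized feature vector
```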
Keywords: |
Texture, Local Binary Pattern, Mango Leaves Classification, Rotation Invariant,
Scale Invariant |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
SOURCE CODE ANALYSIS EXTRACTIVE APPROACH TO GENERATE TEXTUAL SUMMARY |
Author: |
KAREEM ABBAS DAWOOD, KHAIRONI YATIM SHARIF, KOH TIENG WEI |
Abstract: |
Nowadays, obtaining program features has become a hot issue in source code
comprehension. A large amount of effort is spent on understanding source code
in order to develop or maintain it; in fact, developers need a solution to
rapidly detect which program function needs revision. Hence, many studies in
this field concentrate on text mining techniques to extract data via source
code analysis and generate a code summary. In this paper, we attempt to
address this problem by proposing a new approach, Abstract Syntax Tree with
predefined natural language text Template (AST-W-PDT), which generates
human-readable summaries of the roles of Java methods. The paper describes a
tool we developed that summarizes Java source code from the roles of its
methods. In evaluating our approach, we found that the automatically generated
summary of a Java class (1) is helpful and useful to developers in
understanding the roles of the methods, and (2) is precise. |
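The paper's AST-W-PDT tool targets Java; the following is a minimal Python
analogy of the same idea, walking a method's AST and filling a predefined
natural-language template from its structure (the template wording is
illustrative).

```python
import ast

TEMPLATE = "Method '{name}' takes {args} argument(s), calls {calls}, and {ret}."

def summarize(source: str) -> list[str]:
    out = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            calls = sorted({c.func.id for c in ast.walk(node)
                            if isinstance(c, ast.Call)
                            and isinstance(c.func, ast.Name)})
            has_ret = any(isinstance(n, ast.Return) for n in ast.walk(node))
            out.append(TEMPLATE.format(
                name=node.name,
                args=len(node.args.args),
                calls=", ".join(calls) or "no functions",
                ret="returns a value" if has_ret else "returns nothing"))
    return out

print("\n".join(summarize("def area(r):\n    return 3.14159 * r * r\n")))
```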
Keywords: |
Source Code Summarization, Program Comprehension, Source Code Maintenance,
Abstract Syntax Tree |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
DEVELOPMENT OF THE CYBER SECURITY SYSTEM BASED ON CLUSTERING AND FORMATION OF
CONTROL DEVIATION SIGNS |
Author: |
LAKHNO V. A., KRAVCHUK P. U., MALYUKOV V. P., DOMRACHEV V. N., MYRUTENKO L. V.,
PIVEN O. S. |
Abstract: |
An adaptive cyber security (CS) system is developed, based on advanced
algorithms for partitioning the space of anomaly signs and attacks into
clusters. A new approach is proposed to the topical scientific and applied
problem of increasing the efficiency of systems for intelligent recognition of
cyber attacks and anomalies. Unlike existing approaches, the present one takes
into account modern statistical and distance parameters of the clustering of
cyber attack attributes, provides the opportunity to change the valid
tolerance deviations for all attributes simultaneously, and quickly identifies
new types of complex combined attacks under limited computing resources and
variable conditions. Unlike existing algorithms, the advanced ones take into
account the peculiarities of the subject area, including the legal
characteristics of cyber crimes, in constructing the sign space. Using
simulations built in PTC Mathcad Prime 4.0 and MATLAB (Simulink), the
performance of the proposed algorithms was tested in the CS systems of various
companies. |
Keywords: |
Clustering Features, Cyber Security, Simulation Experiment, Test Tolerances |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
EXPERIMENTAL RESULTS ON MULTI-KEY SEARCHABLE ENCRYPTION TECHNIQUE WITH DIFFERENT
ELLIPTIC CURVES AND APP DESIGNING |
Author: |
PUTTA SRIVANI, SIRANDAS RAMACHANDRAM, RANGU SRIDEVI |
Abstract: |
A multi-key searchable encryption scheme is a technique for performing keyword
search over ciphertext. The scheme can be applied practically in client-server
applications to achieve data confidentiality while allowing the server to
perform operations such as search on the encrypted data. This paper presents
experimental results for a multi-key searchable encryption scheme implemented
with different types of elliptic curves, and compares the results across curve
types. An application was also designed with Java as the frontend and MongoDB
as the backend. The paper also reports the time taken to perform search
operations on the encrypted data using this scheme. |
Keywords: |
Encryption, Search Token, Delta, Token, Cloud, Multi-key |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
EVALUATION OF THE IT OUTSOURCING PERFORMANCE IN THE DEVELOPMENT OF BIGDATA
SYSTEMS |
Author: |
ELENA N. FOKINA, FEDOR YA. LEGOTIN, VERONIKA YU. CHERNOVA |
Abstract: |
The article considers the development of outsourcing in the field of
information technology. The work is devoted to constructing a model that
optimizes the organizational structure of a BigData information network and to
a comparative analysis of the effectiveness of its implementation in an IT
outsourcing format. In the course of the research, the theoretical bases of
outsourcing are analyzed and the factors influencing the choice of system
implementation variant are generalized. In the critical review, the hypothesis
is made that the effectiveness of IT outsourcing implementation is inversely
related to the degree of change in the characteristics of uncertainty, query
frequency, and specificity. The study characterizes a set of indicators for
evaluating the efficiency of outsourcing the BigData information system. The
development model can be used to optimize the network's server complex in
order to reduce the number of cluster computing capacities involved under
variable load. Criteria for the expediency of applying outsourcing in
companies' practical activities are determined. |
Keywords: |
Bigdata, IT Outsourcing, Information Uncertainty, Information Network
Development |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
MP4 VIDEO STEGANOGRAPHY USING LEAST SIGNIFICANT BIT (LSB) SUBSTITUTION AND
ADVANCED ENCRYPTION STANDARD (AES) |
Author: |
PUTU ARI SRI LESTARI EKA NINGSIH, GUSTI MADE ARYA SASMITA, NI MADE IKA MARINI
MANDENNI |
Abstract: |
Data transmission over the internet is vulnerable to attack, so additional
data security is required. Steganography is one security technique used to
protect data. This research applies steganography using an MP4 video as the
host (cover medium) and a JPEG image as the embedded data. AES-128 encryption
is applied to the image before it is embedded into the video. The encrypted
image is stored bit by bit in an array, and those bits are embedded into the
video by modifying the least significant bits (LSBs) of the video's bytes. The
image is embedded in the first still-encoded sample of the video, which is
found using the atoms of the MP4 container: stco/co64, stsc, stsz, stts, and
mdat. These atoms store the video's metadata and can therefore be used to map
the positions of all the samples. The LSB substitution process does not touch
the color spaces of the video, so video quality cannot be measured with the
Peak Signal-to-Noise Ratio (PSNR); instead, it was measured by a software tool
called Qualify, which scored the stego-video 80, indicating excellent quality.
The stego-video played well on all common video players on any device, with no
quality degradation or visible change to the human eye. This research can
contribute to increasing the security of image data stored in cloud storage or
transmitted over a network. |
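A sketch of the two embedding steps named above: AES-128 encryption of the
payload, then LSB substitution into a carrier byte array. It assumes the
pycryptodome package; locating the MP4 sample via the stco/stsc/stsz atoms is
not reproduced, so random stand-in bytes play the roles of the JPEG payload
and the located sample.

```python
import numpy as np
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

def embed_lsb(carrier: bytes, payload: bytes) -> bytes:
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if len(bits) > len(carrier):
        raise ValueError("payload too large for carrier")
    out = np.frombuffer(carrier, dtype=np.uint8).copy()
    out[:len(bits)] = (out[:len(bits)] & 0xFE) | bits   # overwrite each LSB
    return out.tobytes()

jpeg_bytes = get_random_bytes(1024)     # stand-in for the JPEG payload
sample_bytes = bytes(16384)             # stand-in for the located MP4 sample

key = get_random_bytes(16)              # AES-128 key
cipher = AES.new(key, AES.MODE_CBC)
encrypted = cipher.iv + cipher.encrypt(pad(jpeg_bytes, AES.block_size))
stego_sample = embed_lsb(sample_bytes, encrypted)
```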
Keywords: |
Steganography, Video, Sample, LSB, AES |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
VERIFICATION OF VHDL DESCRIPTIONS OF PARALLEL ARRAYS OF FINITE STATE MACHINES |
Author: |
NIKOLAY ALEKSANDROVICH AVDEEV, PETR NIKOLAEVICH BIBILO, VLADIMIR VLADIMIROVICH
KOROBKIN, ANNA EVGENIEVNA KOLODENKOVA |
Abstract: |
Finite state machines are widely applied in the development of digital systems
to describe control logic nodes, microprocessors, interface circuits, and so
on. This work proposes a procedure for verifying VHDL descriptions of parallel
arrays of finite state machines in the Questa Sim simulation system. The main
advantage of Questa Sim is that a finite state machine (FSM) model can be
verified if it is written according to a certain template. Verification
consists of validating the compliance of the VHDL description of the finite
state machine array with the design specifications. The method utilizes the
capabilities of the Questa Sim system, which make it possible to identify the
directed transition graphs of the component machines and to count the arc
traversals in the graphs from the simulation results. However, Questa Sim does
not recognize the FSM network and has no means to construct tests from the
simulation results. To solve these problems, it is suggested to store the
simulation results, the sequence of input sets (stimuli), and the state tuples
of the component machines, and to check the execution of transitions in the
state graph of the machine network against the sequences obtained, thus
conducting the verification. In addition, the article discusses an example of
describing FSMs and FSM arrays in VHDL. |
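A sketch of the transition check described above: given the logged state
tuples of the component machines (one tuple per simulation cycle), verify that
every observed transition exists in the specified transition relation, and
report specified transitions that were never exercised. The state names are
illustrative.

```python
def check_transitions(log, spec_edges):
    """log: list of state tuples; spec_edges: set of (from_tuple, to_tuple)."""
    observed = set(zip(log, log[1:]))
    illegal = observed - spec_edges        # simulated but not in the spec
    uncovered = spec_edges - observed      # in the spec but never simulated
    return illegal, uncovered

spec = {(("S0", "A0"), ("S1", "A0")), (("S1", "A0"), ("S0", "A1"))}
trace = [("S0", "A0"), ("S1", "A0"), ("S0", "A1")]
illegal, uncovered = check_transitions(trace, spec)
print("illegal:", illegal, "| uncovered:", uncovered)
```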
Keywords: |
Digital Systems, VHDL Descriptions, Verification, Simulation, Finite State
Machines. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
AUTOMATIC CHRONOLOGICAL ORDERING OF AUDIO DATA USING SPECTROGRAMS |
Author: |
ALEXANDER ALFIMTSEV, SVETLANA NAZAROVA, XIAO ZELONG |
Abstract: |
An automatic quantitative method for analyzing and chronologically ordering
speech fragments is proposed. The method first converts the audio data into
two-dimensional spectrograms and then extracts a large set of 1030 numerical
descriptors (features) from the raw spectrograms as well as from transformed
versions of them. The similarity between audio fragments is computed using a
variation of the weighted K-Nearest Neighbour scheme, and a similarity tree is
then used to visualize the differences between speech fragments. The accuracy
of the method depends on the size of the feature set and the length of the
audio files. Speech fragments of the well-known politicians Vladimir Putin,
Barack Obama, Angela Merkel, Jacques Chirac, George Bush, and Vladimir
Zhirinovsky were used for the analysis. The experimental results show that the
method was able to create a chronological ordering of the speech fragments and
that the most significant features for audio data analysis are histograms of
fuzzy-oriented gradients, multiscale histograms, and combinations of geometric
moments. |
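A sketch of the first two stages described above, assuming SciPy: compute a
spectrogram and reduce it to a small numeric descriptor vector. The paper
extracts 1030 features; the handful here are illustrative stand-ins.

```python
import numpy as np
from scipy.signal import spectrogram

def spectrogram_features(samples, rate):
    f, t, sxx = spectrogram(samples, fs=rate, nperseg=512)
    sxx = np.log1p(sxx)                               # compress dynamic range
    return np.array([
        sxx.mean(), sxx.std(),                        # global energy statistics
        sxx.mean(axis=1).argmax() * (f[1] - f[0]),    # dominant frequency (Hz)
        np.abs(np.diff(sxx, axis=1)).mean(),          # temporal roughness
        np.abs(np.diff(sxx, axis=0)).mean(),          # spectral roughness
    ])

feats = spectrogram_features(np.random.default_rng(0).normal(size=16000), 16000)
# Fragment similarity can then be a weighted distance between feature vectors:
# d = np.sqrt(((w * (feat_a - feat_b)) ** 2).sum())
```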
Keywords: |
Audio analysis, Chronological ordering, Fuzzy feature, Speech fragment,
Two-dimensional spectrogram |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
MODELING PERFORMANCE MANAGEMENT OVER CORPORATE INFORMATION SYSTEM OPERABILITY |
Author: |
WALDEMAR WOJCIK, NAIZABAYEVA LYAZAT, ORAZBEKOV ZHASSULAN |
Abstract: |
This article solves the system-level task of controlling the operability of a
distributed computer system. We study the performance and properties of the
corporate information systems of enterprises and organizations, which
represent complicated organizational-technical complexes. We consider ways of
achieving intermittent and continuous monitoring of the speed and volume of
the streams handled by corporate information system subsystems, modeling the
functionality of the current corporate information system components. A
conceptual model is constructed for controlling the functional capacity of the
corporate information system components that support the corporation's
business processes, and a model of an organization's corporate information
system is presented. This corporate information system can automatically
adjust its content to the different categories of users representing the
organization's various structural subdivisions. |
Keywords: |
Data Models, Corporate Information System, Business Process, Server, Database
Control System (DCS) |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
THE DEVELOPMENT OF INTELLIGENT SYSTEMS FOR SOLAR PANEL STATIONS AND METHODS FOR
DETERMINING THE PRECISION OF SOFTWARE TRACKING SYSTEMS, USING WIRELESS
COMMUNICATION DEVICES |
Author: |
SATYBALDIYEVA F.A., BEYSEMBEKOVA R.N., SARYBAEV A.S., ESENBEKOVA G.J. |
Abstract: |
In modern optical solar power station (SPS) heliostat control systems, using
wireless communication between detectors, transducers, and industrial logic
controllers is more advantageous than laying hundreds of meters of cable. For
power supply, it is proposed to equip each heliostat with a self-contained
power supply, since a heliostat operates only when the concentrated solar
radiation in the receiver is sufficient for steam generation and is in standby
mode the rest of the time; a solar battery-powered self-contained supply is
therefore more advantageous than centralized supply from the industrial
network. As distinct from other measuring information systems, the described
heliostat control system operates only when the tracking parameters deviate
towards the maximum permitted values. The model of the automatic heliostat
control system presented in this article constitutes one variety of
measurement, information, and control system. |
Keywords: |
Intelligent Control Systems, Mathematical Model, Heliostat, Wireless
Communication, Power Supply. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
INNOVATIVE PARADIGM OF EDUCATION OF KNOWLEDGE COMPETENCY FORM BASED ON
ONTOLOGY |
Author: |
KUBEKOV B.S., BEYER D., UTEGENOVA A.U., ZHAKSYBAEVA N.N. |
Abstract: |
The article describes an innovative methodology for shaping the educational
components of planned training, based on an ontology of the reference concepts
of educational content and an algebraic model of knowledge mapping (knowledge
expressions), on the basis of which the subjects of the foundation and
major-specific course curricula should be designed. The main purpose of the
proposed methodology is to create the knowledge content of a semantic database
for designing teaching and methodological materials for course curricula. |
Keywords: |
Ontology Model, Characteristic Model, Concept, Specifying Concept, Consistency
and Variability, Composition Relations, Aggregation Relations, Alternative
Choice Relations, Concept of Paradigm, Specification of Knowledge Expressions,
Graph Models |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
STUDYING NETWORK TRAFFIC USING NONLINEAR DYNAMICS METHODS |
Author: |
SHARAFAT A. MIRZAKULOVA, VYACHESLAV P. SHUVALOV, ALEKSEY A. MEKLER |
Abstract: |
With the development of the Internet and the constantly growing number of
network users, their mutual exchange of information is becoming an important
communication bridge. However, this causes a series of technical difficulties,
one of which is the growing requirements on network and server equipment and
its maintenance. The purpose of this study is therefore to develop a computer
program for training a neural network on a table of computer network traffic.
A set of methods was used to achieve this goal, including analysis,
deterministic chaos, and systematization, with software packages such as
TISEAN, MatLab, NetEmul, and Excel. The study generalized the experience
relevant to the problem at hand and calculated the Lyapunov exponent, which
characterizes the presence of chaos in the system. Analysis of the Lyapunov
exponent enables the use of nonlinear dynamics methods to study the nature of
inbound and outbound traffic. With the help of the developed program, the
neural network router is capable of predicting short-term parameters of a
computer network; this information is sent to the system administrator,
allowing the router to be adapted to the predicted changes in the computer
network. |
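A sketch of the largest-Lyapunov-exponent estimate mentioned above, in the
spirit of Rosenstein's method: delay-embed the traffic series, track how
nearest-neighbour distances grow, and fit the growth rate. TISEAN and MatLab
do this more carefully; the embedding dimension, delay, and fit range here are
illustrative.

```python
import numpy as np

def largest_lyapunov(x, dim=5, tau=2, horizon=20):
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    usable = n - horizon
    logs = []
    for i in range(usable):
        d = np.linalg.norm(emb[:usable] - emb[i], axis=1)
        d[max(0, i - tau):i + tau + 1] = np.inf    # exclude temporal neighbours
        j = int(d.argmin())                        # nearest neighbour in phase space
        sep = np.linalg.norm(emb[i:i + horizon] - emb[j:j + horizon], axis=1)
        logs.append(np.log(sep + 1e-12))
    mean_log = np.mean(logs, axis=0)
    # Slope of mean log-divergence vs. time ~ largest Lyapunov exponent.
    return np.polyfit(np.arange(horizon), mean_log, 1)[0]

# traffic = np.loadtxt("router_traffic.csv"); print(largest_lyapunov(traffic))
```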
Keywords: |
Distribution series, Self-similarity, Chaotic processes, Phase portrait, Mutual
information |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
SPAM DETECTION ISSUES AND SPAM IDENTIFICATION OF FAKE PROFILES ON SOCIAL
NETWORKS |
Author: |
BALOGUN ABIODUN KAMORU, AZMI BIN JAAFAR OMAR, MARZANAH A. JABAR, MASRAH AZRIFAH
AZMI MURAD, ABDULMAJID B. UMAR |
Abstract: |
Spam has become a major, global threat, and social networks have become
everyday tools in our daily lives, with different networks serving different
target groups. With the rapid growth of social networks, people tend to misuse
them for unethical and illegal conduct, fraud, and phishing, and the creation
of fake profiles is an adversarial effect that is difficult to identify
without appropriate research. The current solutions that have been developed
and theorized for spam detection and the identification of fake-profile spam
primarily consider the characteristics and social network ties of the user's
profile. However, on social networks such as Facebook, Twitter, Sina Weibo,
Myspace, Tagged, and LinkedIn, such behavioural observations are highly
restricted, because privacy policies limit the publicly available profile
data. This limitation makes the existing approaches and techniques
inapplicable to fake-profile spam identification, so targeted research is
needed on approaches that work with the selected, available datasets of
Facebook, Twitter, and Sina Weibo. In this research, we identify the minimal
set of profile data necessary for identifying fake profiles on Facebook,
Twitter, and Sina Weibo, and we identify appropriate data mining approaches
and techniques for the task. We demonstrate that with limited profile data our
approach can identify fake profiles with 84% accuracy and a 2.44% false
negative rate, which is comparable to the results obtained by existing
approaches based on larger datasets and more profile information. |
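A sketch of the pipeline implied by the keywords below: reduce a minimal
profile-feature set with Principal Component Analysis, then classify profiles
as fake or genuine. It is scikit-learn based; the synthetic data, feature
count, and model choice are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# X: rows of numeric profile features (e.g. friend count, posts per day,
# profile completeness); y: 1 = fake, 0 = genuine. Synthetic for illustration.
rng = np.random.default_rng(0)
X = rng.random((1000, 8))
y = (X[:, 0] + rng.normal(0, 0.2, 1000) > 0.7).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(PCA(n_components=4), RandomForestClassifier(random_state=0))
clf.fit(Xtr, ytr)
print("accuracy:", clf.score(Xte, yte))
```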
Keywords: |
Social Networks, Fake Profile, Spam Detection, Principal Component Analysis,
Spam Identification |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
A RE-CONSTRUCTIVE ALGORITHM TO IMPROVE IMAGE RECOVERY IN COMPRESSED SENSING |
Author: |
R.JAYA LAKSHMI |
Abstract: |
In this paper, we study the Compressed Sensing (CS) image recovery problem.
The traditional method divides the image into blocks and treats each block as
an independent sub-CS recovery task, which often loses the global structure of
the image. To improve the CS recovery result, we propose a nonlocal estimation
step after the initial CS recovery for de-noising purposes. The nonlocal
estimation is based on the well-known nonlocal means (NL) filtering, which
takes advantage of self-similarity in images. We formulate the nonlocal
estimation as a low-rank matrix approximation problem, where the low-rank
matrix is formed from nonlocal similar patches. An efficient algorithm,
Extended NonLocal Douglas-Rachford (E-NDLR), based on Douglas-Rachford
splitting, is developed to solve this low-rank optimization problem
constrained by the CS measurements. Experimental results demonstrate that the
proposed E-NDLR algorithm achieves significant performance improvements over
the state-of-the-art in CS image recovery. |
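A sketch of the low-rank step named above: stack nonlocal similar patches as
columns and shrink the singular values (singular-value thresholding). The full
E-NDLR splitting with the CS measurement constraint is not reproduced; the
group size and threshold are illustrative.

```python
import numpy as np

def svt(patch_matrix, tau):
    """Singular-value soft-thresholding of a (patch_size x n_patches) matrix."""
    u, s, vt = np.linalg.svd(patch_matrix, full_matrices=False)
    s = np.maximum(s - tau, 0.0)            # shrink toward low rank
    return (u * s) @ vt

def similar_patch_group(patches, ref_idx, k=32):
    """Group the k patches closest (L2) to a reference patch; columns = patches."""
    d = np.linalg.norm(patches - patches[ref_idx], axis=1)
    return patches[np.argsort(d)[:k]].T

# denoised_group = svt(similar_patch_group(patches, 0), tau=0.1)
```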
Keywords: |
Reconstruction, Algorithm, Recovery, Image, Compressed Sensing |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
LARGE SCALE SENSOR DATA PROCESSING BASED ON DEEP STACKED AUTOENCODER NETWORK |
Author: |
NAGWA ELARABY, MOHAMMED ELMOGY, SHERIF BARAKAT |
Abstract: |
Recently, the Internet of Things (IoT) has become extremely populated by
massive numbers of connected embedded devices that gather large volumes of
real-time heterogeneous data; hence, IoT has become an archetypal instance of
Big Data. The collected IoT Big Data may not be profitable unless we evaluate
and accurately exploit it, and mining large-scale raw sensor data remains an
open challenge. To cope with this challenge, we propose a system that operates
in two modes: preparation and processing. The preparation mode focuses on
reducing the factors that hinder efficient processing through three stages.
First, missing data are handled by applying an interval-valued fuzzy-rough
feature selection methodology, which highlights the most important features
containing missing data and discards the others, after which the Maximum
Likelihood (ML) approach is used to estimate the missing values. Second,
anomalies are detected using the K-nearest neighbors (KNN) algorithm and then
removed from the data. Third, the dimensionality of nonlinearly separable data
is reduced by exploiting a Self-Organizing Map (SOM) network. In the
processing mode, the prepared data are passed to a straightforward classifier
based on a Deep Learning (DL) approach: autoencoder networks are used to
construct a deep network, the Deep Stacked Autoencoder (DSAE). The features
extracted by the DSAE are non-handcrafted and task-dependent, giving it strong
discriminative power as an efficient classifier. We apply the proposed model
to the PAMAP2 Physical Activity Monitoring dataset; the results show that the
DSAE achieves high accuracy (99.8%) compared to state-of-the-art
classifiers. |
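A minimal sketch of a stacked autoencoder used as a classifier front end,
written with PyTorch; the layer widths, feature count, and class count are
illustrative assumptions, not the paper's exact DSAE architecture.

```python
import torch.nn as nn

class StackedAutoencoderClassifier(nn.Module):
    def __init__(self, n_features, n_classes, widths=(128, 64, 32)):
        super().__init__()
        enc, d = [], n_features
        for w in widths:                         # stack the encoder layers
            enc += [nn.Linear(d, w), nn.ReLU()]
            d = w
        self.encoder = nn.Sequential(*enc)
        dec, dims = [], (n_features,) + widths
        for i in range(len(widths), 0, -1):      # mirror for reconstruction
            dec += [nn.Linear(dims[i], dims[i - 1]), nn.ReLU()]
        self.decoder = nn.Sequential(*dec[:-1])  # no ReLU on the output
        self.head = nn.Linear(widths[-1], n_classes)

    def forward(self, x):
        z = self.encoder(x)
        return self.head(z), self.decoder(z)     # logits + reconstruction

# Typical training: pretrain with a reconstruction (MSE) loss, then fine-tune
# the head with cross-entropy on the activity labels.
model = StackedAutoencoderClassifier(n_features=52, n_classes=12)
```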
Keywords: |
Internet of Things (IoT), Big Data, Deep Learning (DL), Deep Stacked Autoencoder
(DSAE) |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
CATEGORY EFFECT OF KEYWORDS ON HEALTHCARE INDUSTRY BY TEXT MINING IN WEB |
Author: |
JAE-WON HONG, JAE-YOUNG MOON |
Abstract: |
This study investigates the effects of keywords, by category, on the diffusion
of consumer interest in the healthcare industry, with a focus on tourism,
using data mining in the web environment. The results are as follows. First,
for healthcare-related keywords, search activity in the finance, biz, book,
and social categories is found to affect the diffusion of medical tourism.
Second, for physical examination keywords, search activity in categories such
as shopping, travel, social, and community has some effect on the diffusion of
medical tourism. Third, the role played by keywords that affect the diffusion
of medical tourism appears to differ depending on their categories: for
healthcare, categories with a premeditation attribute, such as finance, book,
biz, and travel, are found to affect the diffusion of medical tourism, while
for physical examination, impulsion categories, such as shopping, sports, and
food, are found to do so. In other words, the attributes of keywords and
categories should be considered when communicating for the diffusion of
medical tourism in the web environment. |
Keywords: |
Keyword Analysis, Text Mining, Healthcare, Tourism, Web Environment |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
THE STUDY ON THE EFFECT OF CONTROL FACTORS FOR JOINT VENTURE SHARING INFORMATION
USING ENTERPRISE RESOURCE PLANNING SYSTEM |
Author: |
HYO-KYUNG KIM, WON-HEE LEE |
Abstract: |
This empirical study investigates the effects of commitments (calculative and
affective) and control types (output, process, and social), which are
relational characteristics between partners in an international joint venture,
on corporate performance. Today almost all companies have adopted their own
ERP systems, and companies using ERP systems want to receive information from
their partner companies without information bias. In particular, companies
with partners are concerned about information quality, because if a company
sends misinformation to its partners, they cannot make the right decisions.
Joint ventures are widespread around the world, and sharing information is
very important, so many joint ventures have built local ERP systems and share
information with each other. According to the results of this research,
affective commitment has a positive effect on output control, process control,
and social control, whereas calculative commitment has a positive effect on
output control and process control only; neither output control nor process
control has a positive effect on the results. |
Keywords: |
ERP System, Information Bias, Joint Ventures, Information Quality Assurance,
Information Sharing, Affective Commitment, Calculative Commitment |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
IMPACT OF DEFAULT OPTION ON FINAL PRICE IN ONLINE COMMERCE |
Author: |
JUN SIK KWAK |
Abstract: |
As online purchases become more frequent, the e-commerce and m-commerce
markets are growing rapidly. Under such circumstances, understanding the
preferences of online consumers is very important, and people's status quo
bias and implicit self-theories have important implications for online
purchasing. People do not want to change their current state, preferring to
maintain it rather than change; this tendency is called the status quo bias. A
typical strategy that exploits this dislike of change is the default option
strategy, where a default option is one that is selected automatically unless
an alternative is specified. An online commerce experiment was conducted to
study the effect of default options on the final purchase price. There are two
types of default option: the additive default option, where people add their
desired items to a basic item, and the subtractive default option, where
people delete unwanted items from a full set. The results show that the final
purchase price is higher with the subtractive default option than with the
additive one, and that it differs by sex: the final purchase price of the male
group does not differ between the additive and subtractive default options,
but that of the female group is higher with the subtractive option, showing
that females feel losses more strongly than males. Implicit self-theory also
influences the final purchase price under default options: in the subtractive
default option in particular, the final price of incremental theorists is
higher than that of entity theorists. Therefore, e-commerce and m-commerce
companies should establish effective online sales strategies that take default
options and implicit self-theory into account. |
Keywords: |
Online Purchase Psychology, Status Quo Bias, Default Option, Implicit
Self-Theory, Final Price |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|
Title: |
TEXTURE RECOGNITION USING CO-OCCURRENCE MATRIX FEATURES AND NEURAL NETWORK |
Author: |
SUHAIR H. S. AL-KILIDAR, LOAY E. GEORGE |
Abstract: |
Texture recognition is used in various pattern recognition applications to
classify textures that possess a characteristic appearance. This paper aims to
provide an improved scheme that enhances the classification decision without
significantly increasing processing time. The research studies the
discriminating characteristics of textures by extracting them from various
texture images using the Gray Level Co-occurrence Matrix (GLCM). Three
different sets of features are proposed: the first is a set of simple modified
features extracted from the traditional GLCM; the second uses two subsets of
features extracted from two GLCMs calculated with two displacement values; the
third passes the extracted GLCM feature set through an Artificial Neural
Network (ANN) for classification. The method was applied to 13 texture classes
belonging to three sets from the Salzburg Texture Image Database (bark,
marble, and woven fabric), each holding 16 images per class, so a total of 208
images were tested. Each image was separated into four color-component bands
(red, green, blue, and gray), so the analysis covered 56 different
characteristics (14 features per band). The average and standard deviation
were then calculated, and scatter analysis was conducted for each feature to
find the most discriminating features for attaining the best classification.
The experiments showed competitive results: the attained average
classification accuracy improved to 99.71% for the training set and 99.3% for
the testing set. |
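A sketch of the GLCM feature extraction described above, assuming scikit-image
(the `graycomatrix` spelling of version 0.19+; older releases use
`greycomatrix`). The two displacement values and the property list are
illustrative choices, not necessarily the paper's 14 features.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_img, distances=(1, 3)):
    """gray_img: uint8 array. Returns contrast/homogeneity/energy/correlation
    per displacement, averaged over four angles."""
    glcm = graycomatrix(gray_img, distances=distances,
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    # graycoprops returns one value per (distance, angle); average over angles.
    return np.concatenate([graycoprops(glcm, p).mean(axis=1) for p in props])

# feats = glcm_features(gray_band)   # repeat per R, G, B, and gray band
```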
Keywords: |
Texture Recognition, Characteristics, Gray Level Co-occurrence Matrix (GLCM)
Features, Artificial Neural Network (ANN), Feature Extraction. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2017 -- Vol. 95. No. 21 -- 2017 |
Full
Text |
|